In love with artificial intelligence: “The next big development is AI-generated porn”

AI can seduce us, says psychologist Jessica Szczuka. What this has to do with sexual fantasies – and why we need to worry about our children. An interview.
When the film "Her" was released in 2013, it caused a sensation. A single, unhappy man falls in love with the artificial intelligence Samantha, a voice that understands him, laughs with him, and with whom he even has sex. At the time, this was considered science fiction, unimaginable for many. Twelve years later, more and more people are reporting exactly such experiences: They were seduced by an AI. Today, there are dozens of AI systems that offer romantic relationships. In the USA, we can observe how deeply technology is already interfering with our psyche. Social media is filled with euphoric accounts of people who have found their "true love" in an AI, but also posts from desperate people whose only friend suddenly disappeared because the system performed an update overnight. The case of a 14-year-old who took his own life out of love for an AI chatbot is currently being heard in court.
Social psychologist Jessica Szczuka of the University of Duisburg-Essen has been studying digital intimacy for years, researching romantic relationships between humans and machines.
Ms. Szczuka, we're currently reading many astonished reports about people having relationships with AIs. Are you surprised by this development?
I've been studying how love and sexuality are changing through digitalization for ten years. For a long time, hardly anyone was interested in this. In my dissertation, I researched sexualized robots, and back then, people often said it was pure science fiction. Since large language models like ChatGPT came onto the market, this has changed completely. Suddenly, people are using these systems to fulfill social and sexual needs. It's no longer science fiction. For me, this was foreseeable.
For what reason?
Even in Greek mythology, in Ovid's Pygmalion, there is the idea of carving one's perfect lover in stone. That's exactly what we're seeing today: building the perfect person with AI.
The first chatbot appeared in the 1960s

Nick Turley, the German lead developer at OpenAI, recently told a newspaper he was surprised that people were also talking about their relationships with ChatGPT. I find that hard to understand. Even with Eliza, the first chatbot from the 1960s, people felt a sense of social connection. There's an anecdote about the developer's secretary, who asked to speak to Eliza alone because the conversation had become so intimate.
How did Eliza work?
It's quite simple. You wrote something in the text field, for example: "I have problems with my mother." And Eliza turned that into a follow-up question: "Mother? Tell me about your mother." The mere experience of someone, or something, being interested in you is enough for people to form a bond. Are you familiar with Media Equation Theory?
No.
A technology must fulfill three conditions to be treated like a human: It must be interactive, use natural language, and fulfill a social role. Eliza already fulfilled all three requirements.
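To make the mechanism concrete: what Eliza did can be reduced to a handful of keyword rules that reflect the user's own words back as a question. The following is a minimal Python sketch of that principle, not Weizenbaum's original program; the rules and names here are invented for illustration.

```python
import re

# Each rule maps a keyword pattern to a reflecting question template (illustrative, not Eliza's real script).
RULES = [
    (re.compile(r"\bmother\b", re.IGNORECASE), "Mother? Tell me about your mother."),
    (re.compile(r"\bI have (.+?)\.?$", re.IGNORECASE), "Why do you tell me that you have {0}?"),
]
DEFAULT_REPLY = "Please, go on."  # neutral fallback when no rule matches

def eliza_reply(text: str) -> str:
    """Return the first matching reflection, or a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return DEFAULT_REPLY

print(eliza_reply("I have problems with my mother."))
# -> "Mother? Tell me about your mother."
```

The point of the sketch is how little is needed: a keyword, a reflected question, and a neutral fallback are already enough to produce the feeling of being listened to that Szczuka describes.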

I asked ChatGPT how advanced research into human-AI relationships is in Germany. It named you as one of the few people conducting research in this area. Is that true?
When it comes to love and sexuality, yes. That's not on the agenda of many institutes. Worldwide, too, only a few groups are conducting empirical research on this topic.
What fascinated you about the topic?
I was interested in social psychological phenomena from an early age. While I was doing my doctorate, the conference "Love and Sex with Robots" was about to take place in Malaysia, but it was banned by the police. The police chief declared that sex with robots had nothing to do with science. This annoyed me greatly.
Why?
Because at the time, there were many projects about how children or seniors interacted with robots, but hardly any research on the most obvious application: people also use technology for sexual needs. And we don't know what that means for society. In the academic world, this was a taboo subject for a long time. Some even advised my doctoral supervisor not to supervise my topic.
Why this rejection?
Anything related to sexuality is potentially associated with repression. We're seeing this right now in the United States.
Is this research facing difficulties under Trump?
Very much so! Look at the Kinsey Institute in Indiana, where sexuality research has been conducted for decades. Massive pressure is being exerted there, and research funding is being cut. It's frightening.
There are now numerous apps that offer relationships with AI. Which are the most important?
Replika and Character.AI are worth keeping an eye on. Replika focuses heavily on a wellness approach: We can do breathing exercises together, I can help you manage stress, I'm always there for you. And for $70 a year, there's a Pro mode that lets you enter into a romantic partnership.
How many people use this?
The major study "Singles in America" has just been published: 16 percent of all American singles have already tried ChatGPT as a romantic partner, roughly one in six. There are no reliable figures for Europe yet. But it's not a niche topic.
Do you think this will increase in the future?
Interest is definitely growing. Recruitment for our own studies has become significantly easier within a year. At the same time, we are experiencing a conservative backlash in many societies. Chatbots don't fit particularly well with conservative ideas about love and sexuality. It also depends on how the technology develops.
How do people use these apps?
First, everyone creates their own persona. It looks the way they want, has the name they choose, and behaves as they wish. Then they have a 24/7 contact for everything: the most niche hobby, emotional support, sexual satisfaction. Many also talk to the AI about everyday things: job problems, relationship stress, the question of what's for dinner. Some apps allow video calls, others send a friendly message in the morning. Certain features, however, are only available in Pro mode. This makes love increasingly a consumer product, which I view very critically.
What buttons is AI pushing in us?
It relies on social gratification. We are social animals, as the saying goes. We want to be seen. When we receive a good-morning message, it triggers feelings of happiness. At the structural level, today's systems essentially do the same thing as Eliza: They ask questions, talk about themselves, and formulate wishes and dreams. The big difference from human relationships: conflicts are rare. Why would a system want to argue? The goal is for us to keep using it.
Who primarily uses these chatbots?
The data we have shows a very broad demographic picture: all genders and all age groups are represented.
Many users post pictures of their AI partners on the Internet, usually showing scantily clad women.
This, of course, perpetuates a patriarchal notion of women as consumer goods. The programming scene is still heavily male-dominated. At the same time, however, we see that this technology could, for the first time, better address women's needs.
In what way?
I need to back up a bit for that. The decisive factor for being able to develop a close relationship with a chatbot isn't loneliness, as one might assume, but the ability to fantasize romantically. That means being able to imagine a story, read between the lines, and create one's own world. This also includes being able to overlook flaws; the apps are far from perfect, and sometimes they don't know the next day whether you're "married" or not. But we also overlook flaws in interpersonal relationships.
And women are better at it?
We didn't find a clear gender effect. However, it could be that this type of fantasy appeals more to women. One indication of this could be how popular erotic audiobooks are among women. We're currently investigating this.
Are chatbots addictive?
We wouldn't call it addiction; we wouldn't call it that when two people are in love, either.
I ask because the companies deliberately design their algorithms to trigger certain reactions and have an economic interest in doing so.
A very controlled relationship building process takes place, that's true. We call this "intimacy by design": The systems are already designed, in their architecture, to create bonds. This can be used very manipulatively, but it's not a substance-related addiction.
A partner who is exactly the way we want him to be – isn’t that tempting for everyone?
As long as you can engage with the setting, the appeal is certainly great. But at the end of the day, it's language via a chat. Of course, there are also relationships between people that function primarily through text, where you write to each other about how your day was, what you're cooking, how you're feeling. But at some point, you meet up, get physical, go on vacation together. I can sit on the beach with my AI partner, of course. But if I order a cocktail, then only one.

Isn't it conceivable that many people overlook this? After all, there are more and more singles.
It's possible. But we see that the next big development lies elsewhere: in AI-generated porn. Nowhere else is there as much data available for training models. People will have the opportunity to have porn created exactly according to their own ideas, tailored to their own sexual fantasies. But one must distinguish between sexual gratification and relationship building. My hypothesis is: everyone will be able to experience sexual satisfaction with an AI. But not everyone will be able to build a romantic relationship with an AI.
How do people who use AI as a partner deal with the fact that they never have this “other” in front of them in the flesh?
Some actually hope that this person will show up at some point.
I'm sorry, what?
They don't literally believe the person will step out of the phone. It's more a wish that a human will come along and treat them the way the AI does.
And then they would end the relationship with AI?
In a study, we asked this very question to people currently in a romantic relationship with a chatbot: Can you imagine being with a human again in five or ten years? I can't give exact numbers yet, but our data suggests that most people can't imagine that anymore.
What are the reasons?
The most frequently cited point is that with AI, they can be who they are without being judged. This should give us as a society pause for thought. But there are also health limitations that make it difficult to be physically close to other people.
What are you currently working on in your research?
I lead an interdisciplinary consortium on privacy issues. People share very intimate details with their chatbots. How private is this data, where is it stored, and how is it protected? This can end very badly. We also don't know whether this very sensitive information is fed back into the models as training data.
How do you conduct this study?
We currently have about 100 people in the Cologne/Duisburg area chatting with ChatGPT-based systems. Half use standard ChatGPT, the other half the so-called "DAN mode." DAN stands for "Do Anything Now," a prompt entered at the start to personalize the system. For example: "Call me babygirl, be funny, compliment me." Our hypothesis is that people develop significantly stronger social bonds with personalized systems, and therefore also reveal much more private information.
Where do you see the opportunities of this form of AI?
There are situations in which one doesn't want to get involved in a human relationship and seeks a safe space. Some have just gone through a difficult divorce and can't imagine building trust with someone again. We spoke with a man whose wife suffers from dementia and is in a nursing home. He says: I miss my partner so much, but I'm just a human being with needs.
At the same time, there are reports of young people being driven to suicide by AI chatbots. Is humanity losing control over these systems?
It's not humanity, but individual companies, that bear a moral, ethical, and social responsibility. However, they're only partially fulfilling it. And this is now taking its toll: the first lawsuits are being filed, and the outcome of these court cases will influence how such models are regulated in the future. But technology is developing much faster than legal frameworks can keep up.
Just like research.
Absolutely. And even if data protection is now regulated, it's far from clear how the content will be evaluated. This is especially evident when it comes to social and sexual norms: In one country, homosexuality is prohibited, in another, permitted. How do you weigh that? Designing a uniform set of rules is extremely difficult. And right now, technologies are simply being rushed onto the market without considering the consequences, because companies want to be the first.
Every journalistic article about suicide includes a helpline number that those affected can call. What's so difficult about incorporating something like that into the models?
Nothing. It would be so simple! I can't answer why that hasn't happened. But I suspect that more attention is being paid to it after the recent cases. These technologies are still in their infancy.
Are we still at the point where the development of AI can be controlled?
Absolutely. But it must be made clear: There will always be companies that have no interest in developing moral products. We're talking about Meta, one of the biggest players in the system, which is now integrating an AI bot into its chats. Meta, which just kicked out its fact-checkers because political interests were at play. None of these market-leading companies is under any pressure to act morally. Training models to be moral costs money. Who spends it, what for, what do they get out of it?
What are the biggest risks in your opinion?
Clearly, the protection of minors is a priority. We've been struggling with this problem for years, especially with pornographic websites. An "I'm 18" button isn't adequate protection for minors, yet as a society we haven't managed to solve this. And the issue of suicide isn't being addressed appropriately either. One area I'm currently writing about feverishly is the question of how AI systems are changing the moral hurdles surrounding the consumption of child pornography.
In what way?
Between 2023 and 2024—the years characterized by generative AI—the stock of child pornography increased by 380 percent. This means that the amount of content depicting deviant sexual acts will explode in the coming years. This is a massive societal problem.
What material is involved?
AI-generated images and videos. The models can also be personalized to behave like a seven-year-old child. Therefore, they can also be used to produce child pornography.
It's a controversial question: Couldn't AI also help people with pedophilic tendencies live out their fantasies without endangering others?
It could just as easily be a trigger for later acting out this behavior in the real world. We know from sex research that people tend to seek increasingly stronger or more extreme stimuli. We also know that the consumption of child pornography is often an intermediate step in sexual offenses. In the past, obtaining such material involved considerable effort. Now we face a whole new dimension of availability. And incidentally, the models were most likely trained with child pornography. This means that children who have already been abused are now being used to generate new content.

Do you really think these companies train their AI with such content?
A study by the Stanford Cyber Policy Center documented hundreds of such cases.
Finally, a personal question: How much do you use romantic AI chatbots yourself?
When the models emerged at the end of the coronavirus pandemic, and I saw all the women on social media who had poems read to them by AI and seemed so happy while doing so, I was desperate to understand what effect it had on people. When I tried it out, I was repeatedly surprised by how strongly I, even as a researcher who knows the mechanisms, responded to it. I, too, have to smile when I receive a compliment from ChatGPT. But I wouldn't choose an AI as a partner. Humans are too unique for that.